Averaging Regularized Estimators

Authors

  • Michiaki Taniguchi
  • Volker Tresp

Abstract

We compare the performance of averaged regularized estimators. We show that the improvement in performance which can be achieved by averaging depends critically on the degree of regularization used in training the individual estimators. We compare four different averaging approaches: simple averaging, bagging, variance-based weighting, and variance-based bagging. For all of the averaging methods, the greatest improvement over the individual estimators is achieved if no or only a small degree of regularization is used. Here, variance-based weighting and variance-based bagging are superior to simple averaging or bagging. Our experiments indicate that better performance, for both the individual estimators and for averaging, is achieved in combination with regularization. With increasing degrees of regularization, the two bagging-based approaches (bagging and variance-based bagging) outperform the individual estimators, simple averaging, and variance-based weighting. Bagging and variance-based bagging appear to be the best combining methods overall across a wide range of degrees of regularization.

∗E-mail: [email protected], [email protected]
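Two of the combining methods named in the abstract can be sketched in a few lines: simple averaging takes an unweighted mean over the member estimators, while variance-based weighting weights each member inversely to an estimate of its error variance. The sketch below is a minimal illustration under assumed inputs (the array names, the toy target, and the per-estimator variance estimates are hypothetical, not taken from the paper):

```python
import numpy as np

def simple_average(preds):
    """Unweighted mean over M estimators; preds has shape (M, N)."""
    return np.mean(preds, axis=0)

def variance_weighted_average(preds, variances):
    """Weight each estimator inversely to its estimated error variance.

    preds, variances: hypothetical (M, N) arrays holding M estimators'
    predictions and variance estimates at N query points.
    """
    w = 1.0 / np.asarray(variances)       # inverse-variance weights
    w /= w.sum(axis=0, keepdims=True)     # normalize over the M estimators
    return np.sum(w * preds, axis=0)

# Toy check: three noisy "estimators" of the target f(x) = x,
# with noise levels 0.05, 0.1, and 0.5.
x = np.linspace(0.0, 1.0, 5)
rng = np.random.default_rng(0)
preds = np.stack([x + rng.normal(0.0, s, x.shape) for s in (0.05, 0.1, 0.5)])
variances = np.array([[0.05**2], [0.1**2], [0.5**2]]) * np.ones_like(preds)

avg = simple_average(preds)
vw = variance_weighted_average(preds, variances)
```

With exact variance estimates, the inverse-variance weights downweight the noisiest member, which is the intuition behind the paper's finding that variance-based weighting helps most when the individual estimators are weakly regularized and therefore high-variance.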


Similar resources

Regularization of Wavelet Approximations

In this paper, we introduce nonlinear regularized wavelet estimators for estimating nonparametric regression functions when sampling points are not uniformly spaced. The approach can apply readily to many other statistical contexts. Various new penalty functions are proposed. The hard-thresholding and soft-thresholding estimators of Donoho and Johnstone are specific members of nonlinear regular...
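The hard- and soft-thresholding estimators mentioned in this snippet have simple closed forms: hard thresholding zeroes out wavelet coefficients below a threshold and keeps the rest unchanged, while soft thresholding additionally shrinks the survivors toward zero by the threshold amount. A minimal sketch (the coefficient values are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(w, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def hard_threshold(w, t):
    """Hard thresholding: zero out coefficients with magnitude <= t."""
    return np.where(np.abs(w) > t, w, 0.0)

coeffs = np.array([-2.0, -0.3, 0.1, 0.8, 3.0])
soft = soft_threshold(coeffs, 0.5)   # large coefficients survive, shrunk by 0.5
hard = hard_threshold(coeffs, 0.5)   # large coefficients survive unchanged
```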


An adaptive method for combined covariance estimation and classification

In this paper a family of adaptive covariance estimators is proposed to mitigate the problem of limited training samples for application to hyperspectral data analysis in quadratic maximum likelihood classification. These estimators are the combination of adaptive classification procedures and regularized covariance estimators. In these proposed estimators, the semi-labeled samples (whose label...


Sparsistency of ℓ1-Regularized M-Estimators

We consider the model selection consistency or sparsistency of a broad set of ℓ1-regularized M-estimators for linear and nonlinear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parame...


PII: S0079-6565(00)00032-7

1. Introduction and historical remarks
2. Spectral estimators, parameter estimators and nonlinear problems
3. Non-Hermitian quantum mechanics: connection to the harmonic inversion problem (HIP)
4. HIP can be solve...



Journal:
  • Neural Computation

Volume 9, Issue 

Pages  -

Publication year: 1997